Invariant Information Bottleneck for Domain Generalization

Authors

Abstract

Invariant risk minimization (IRM) has recently emerged as a promising alternative for domain generalization. Nevertheless, its loss function is difficult to optimize for nonlinear classifiers, and the original optimization objective can fail when pseudo-invariant features or geometric skews exist. Inspired by IRM, in this paper we propose a novel formulation for domain generalization, dubbed invariant information bottleneck (IIB). IIB aims at minimizing invariant risks while simultaneously mitigating the impact of pseudo-invariant features and geometric skews. Specifically, we first present a formulation for invariant causal prediction via mutual information. We then adopt a variational formulation of the mutual information to develop a tractable loss function for nonlinear classifiers. To overcome the failure modes of IRM, we propose to minimize the mutual information between the inputs and their corresponding representations. IIB significantly outperforms IRM on synthetic datasets where pseudo-invariant features and geometric skews occur, showing the effectiveness of the proposed formulation in overcoming the failure modes of IRM. Furthermore, experiments on DomainBed show that IIB outperforms 13 baselines by 0.9% on average across 7 real datasets.
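As a rough illustration of the objective described in the abstract, here is a minimal NumPy sketch (not the authors' code, and with hypothetical function names) of an IIB-style loss: an empirical risk term plus a β-weighted variational upper bound on I(X;Z), the mutual information between inputs and representations, computed as the KL divergence from a Gaussian encoder posterior to a standard normal prior.

```python
import numpy as np

def kl_to_standard_normal(mu, logvar):
    """Per-sample KL( N(mu, diag(exp(logvar))) || N(0, I) ).

    Averaged over a batch, this is a standard variational upper
    bound on I(X; Z) for a Gaussian encoder q(z|x).
    """
    return 0.5 * np.sum(mu**2 + np.exp(logvar) - logvar - 1.0, axis=1)

def cross_entropy(logits, y):
    """Mean cross-entropy of integer labels y under softmax(logits)."""
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    logp = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -logp[np.arange(len(y)), y].mean()

def iib_style_loss(logits, y, mu, logvar, beta=1e-3):
    """Sketch of an IIB-style objective: task risk + beta * bottleneck term.

    `beta` trades off predictive risk against compression of the
    representation; the paper's full objective also involves an
    invariance penalty across training environments, omitted here.
    """
    risk = cross_entropy(logits, y)
    bottleneck = kl_to_standard_normal(mu, logvar).mean()
    return risk + beta * bottleneck
```

With uniform logits over two classes and a posterior equal to the prior (mu = logvar = 0), the loss reduces to the task risk log 2, since the KL term vanishes.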


Similar Articles

Learning and Generalization with the Information Bottleneck

The Information Bottleneck is an information theoretic framework that finds concise representations for an ‘input’ random variable that are as relevant as possible for an ‘output’ random variable. This framework has been used successfully in various supervised and unsupervised applications. However, its learning theoretic properties and justification remained unclear as it differs from standard...
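In symbols, the Information Bottleneck objective described above is the Lagrangian (with trade-off parameter β) that compresses the representation T of input X while retaining relevance for output Y:

```latex
\min_{p(t \mid x)} \; I(X; T) - \beta \, I(T; Y)
```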


Domain Generalization via Invariant Feature Representation

This paper investigates domain generalization: How to take knowledge acquired from an arbitrary number of related domains and apply it to previously unseen domains? We propose Domain-Invariant Component Analysis (DICA), a kernel-based optimization algorithm that learns an invariant transformation by minimizing the dissimilarity across domains, whilst preserving the functional relationship betwe...


Information Bottleneck Domain Adaptation with Privileged Information for Visual Recognition

We address the unsupervised domain adaptation problem for visual recognition when an auxiliary data view is available during training. This is important because it allows improving the training of visual classifiers on a new target visual domain when paired additional source data is cheaply available. This is the case when we learn from a source of RGB plus depth data, and then test on a new RG...


Generalization of Linear Shift Invariant System in the Fractional Domain

The fractional Fourier transform is a flourishing field of active research due to its wide range of applications. It is well known that the fractional Fourier transform is linear, but not shift invariant, unlike the conventional Fourier transform. Linear shift invariant systems can be expressed in terms of the convolution of two functions. Convolution for the fractional Fourier transform, defined by Alme...


An Information-Theoretic Discussion of Convolutional Bottleneck Features for Robust Speech Recognition

Convolutional Neural Networks (CNNs) have demonstrated strong performance in speech recognition systems, both for feature extraction and for acoustic modeling. In addition, CNNs have been used for robust speech recognition, and competitive results have been reported. The Convolutive Bottleneck Network (CBN) is a kind of CNN which has a bottleneck layer among its fully connected layers. The bottleneck fea...



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2022

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v36i7.20703